Optimal Soaring with Hamilton-Jacobi-Bellman Equations
Abstract
Competition glider flying, like other outdoor sports, is a game of stochastic optimization, in which mathematics and quantitative strategies have historically played an important role. We address the problem of uncertain future atmospheric conditions by constructing a nonlinear Hamilton-Jacobi-Bellman equation for the optimal speed to fly, with a free boundary describing the climb/cruise decision. This equation comes from a singular stochastic exit time control problem and involves a gradient constraint, a state constraint and a weak Dirichlet boundary condition. An accurate numerical solution requires robust monotone numerical methods. The computed results are directly applicable to glider flight.
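As illustrative background only (not taken from the paper), the sketch below computes the classical deterministic "speed to fly" of MacCready theory for a quadratic glide polar; the polar coefficients and climb rates are hypothetical illustration values. The paper replaces this deterministic rule with the solution of a stochastic HJB free-boundary problem, but the deterministic calculation shows concretely what "optimal speed to fly" means: choose the cruise speed that maximizes average cross-country speed given the climb rate expected in the next thermal.

import numpy as np

def sink_rate(v, a=0.002, b=-0.1, c=1.85):
    # Glider sink rate in m/s (positive = down) as a quadratic polar in airspeed v (m/s).
    # Coefficients are hypothetical illustration values, not data from the paper.
    return a * v**2 + b * v + c

def speed_to_fly(climb_rate, v_grid=np.linspace(20.0, 60.0, 4001)):
    # Cruise time per unit distance is 1/v; regaining the altitude lost per unit
    # distance (sink_rate(v)/v) in a thermal of strength climb_rate takes
    # sink_rate(v) / (v * climb_rate).  The average cross-country speed is therefore
    #     v * climb_rate / (climb_rate + sink_rate(v)),
    # which is maximized here by a simple grid scan over candidate cruise speeds.
    avg = v_grid * climb_rate / (climb_rate + sink_rate(v_grid))
    i = int(np.argmax(avg))
    return v_grid[i], avg[i]

if __name__ == "__main__":
    for m in (1.0, 2.0, 4.0):  # expected climb rate in the next thermal, m/s
        v_opt, v_avg = speed_to_fly(m)
        print(f"climb {m:.1f} m/s -> cruise at {v_opt:.1f} m/s, average {v_avg:.1f} m/s")

The grid scan is used purely for brevity; the same optimum satisfies the first-order condition s'(v) = (m + s(v)) / v, the familiar tangent construction on the polar.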
Similar resources
On the dynamic programming approach for the 3D Navier-Stokes equations
The dynamic programming approach for the control of a 3D flow governed by the stochastic Navier-Stokes equations for incompressible fluid in a bounded domain is studied. By a compactness argument, existence of solutions for the associated Hamilton-Jacobi-Bellman equation is proved. Finally, existence of an optimal control through the feedback formula and of an optimal state is discussed.
Hamilton-Jacobi-Bellman equations for Quantum Optimal Feedback Control
We exploit the separation of the filtering and control aspects of quantum feedback control to consider the optimal control as a classical stochastic problem on the space of quantum states. We derive the corresponding Hamilton-Jacobi-Bellman equations using the elementary arguments of classical control theory and show that this is equivalent, in the Stratonovich calculus, to a stochastic Hamilton...
Second Order Hamilton-Jacobi Equations in Hilbert Spaces and Stochastic Boundary Control
The paper is concerned with fully nonlinear second order Hamilton-Jacobi-Bellman-Isaacs equations of elliptic type in separable Hilbert spaces which have unbounded first and second order terms. The viscosity solution approach is adapted to the equations under consideration and the existence and uniqueness of viscosity solutions is proved. A stochastic optimal control problem driven by a paraboli...
Hamilton-Jacobi-Bellman equations for Quantum Filtering and Control
We exploit the separation of the filtering and control aspects of quantum feedback control to consider the optimal control as a classical stochastic problem on the space of quantum states. We derive the corresponding Hamilton-Jacobi-Bellman equations using the elementary arguments of classical control theory and show that this is equivalent, in the Stratonovich calculus, to a stochastic Hamilto...
Hamilton-Jacobi-Bellman Equations and the Optimal Control of Stochastic Systems
In many applications (engineering, management, economics) one is led to control problems for stochastic systems: more precisely, the state of the system is assumed to be described by the solution of stochastic differential equations and the control enters the coefficients of the equation. Using the dynamic programming principle R. Bellman [6] explained why, at least heuristically, the optimal cos...
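For orientation only, the dynamic programming principle leads, at least formally, to a Hamilton-Jacobi-Bellman equation of the following standard textbook form for the value function of a controlled diffusion; this is a generic illustration, not a formula quoted from the cited reference.

% Finite-horizon HJB equation for dX_t = b(X_t, a_t) dt + sigma(X_t, a_t) dW_t,
% with running cost f and terminal cost g; generic background, not from the cited work.
\[
  \partial_t V(t,x)
  + \inf_{a \in A}\left\{ b(x,a)\cdot\nabla_x V(t,x)
  + \tfrac{1}{2}\,\mathrm{tr}\!\left(\sigma(x,a)\sigma(x,a)^{\top} D_x^2 V(t,x)\right)
  + f(x,a) \right\} = 0,
  \qquad V(T,x) = g(x).
\]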
Publication date: 2004